Web Survey Bibliography
Many Advanced Technological Education (ATE) principal investigators (PIs), evaluators, and researchers use mailed or emailed surveys to help them assess the effectiveness of their work. However, a common problem is a low response rate and the potential it creates for nonresponse bias: the bias that occurs when those who do not respond to a survey differ in some systematic way from those who do. If this occurs, serious questions may be raised about the validity of the findings. The purpose of this report is to raise awareness of nonresponse problems and present ways to address them among those who use surveys in their studies of ATE grants, including PIs, evaluators, researchers, and those who may be evaluating the full program. We summarize the research on nonresponse issues, present generally accepted standards for response rates, offer suggestions on how to increase response rates, describe ways to check for nonresponse bias, and apply these methods to a research study of the ATE program sponsored by the National Science Foundation (NSF). We used a mailed survey to gather information about the impact and sustainability of the ATE program. Although we had a high response rate, we decided to check for nonresponse bias. We used three methods to investigate the problem: we compared responders with the total population, compared responders with nonresponders on five background characteristics, and compared early responders with late responders on two scales, an ATE Impact Scale and an ATE Sustainability Scale, using late responders as a surrogate for nonresponders. We found a slightly higher response rate for the larger center grants than for projects. However, we found no differences in the actual survey responses between early and late responders, our proxy for nonresponders, which led us to believe our results did not contain nonresponse bias. We believe that our experience will be useful for those doing research on and evaluation of the ATE program, and we hope the suggestions we offer will help improve the validity of their findings.
University of Colorado
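A minimal sketch of the early-versus-late responder comparison the abstract describes, assuming survey data in a pandas DataFrame; the column names (response_wave, impact_scale, sustainability_scale), the file path, and the choice of Welch's t-test are illustrative assumptions, not details taken from the report:

```python
# Sketch: early vs. late responders as a nonresponse-bias check,
# treating late responders as a proxy for nonresponders.
# Hypothetical columns:
#   response_wave        -- "early" or "late", based on return date
#   impact_scale         -- respondent's ATE Impact Scale score
#   sustainability_scale -- respondent's ATE Sustainability Scale score
import pandas as pd
from scipy import stats

def check_nonresponse_bias(df: pd.DataFrame, scale: str) -> None:
    """Compare early and late responders on one scale with a two-sample t-test."""
    early = df.loc[df["response_wave"] == "early", scale].dropna()
    late = df.loc[df["response_wave"] == "late", scale].dropna()
    t, p = stats.ttest_ind(early, late, equal_var=False)  # Welch's t-test
    print(f"{scale}: early mean={early.mean():.2f}, "
          f"late mean={late.mean():.2f}, t={t:.2f}, p={p:.3f}")

# Example usage (hypothetical data file):
# df = pd.read_csv("ate_survey_responses.csv")
# for scale in ("impact_scale", "sustainability_scale"):
#     check_nonresponse_bias(df, scale)
```

A nonsignificant difference on such a comparison is consistent with (though not proof of) the absence of nonresponse bias, which is the reasoning the abstract applies.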
Web survey bibliography - 2013 (465)
- Effects of Response Format on Measurement of Readership; 2013; Thomas, R. K., Cobb, C. L., Baim, J.
- Potential Impact of Modifying the Fielding Time of a Web-Based Survey; 2013; Baum, H. M., Chandonnet, A.
- How Representative are Google Consumer Surveys?: Results From an Analysis of a Google Consumer Survey...; 2013; Krishnamurty, P., Tanenbaum, E., Stern, M. J.
- One Drink or Two: Does Quantity Depicted in an Image Affect Web Survey Responses?; 2013; Charoenruk, N., Stange, M.
- A Comparison Between Screen/Follow Item Format and Yes/No Item Format on a Multi-Mode Federal Survey; 2013; Hernandez, S. J., Arakelyan, S. N., Welch, V. E.
- Using Multiple Modes in Follow-Up Contacts in Random-Digit Dialing Surveys; 2013; Chowdhury, P. P.
- Tablets and Smartphones and Netbooks, Oh My! Effects of Device Type on Respondent Behavior; 2013; Ross, H., Mendelson, J., Lackey, M.
- Impacts of Unit Nonresponse in a Recontact Study of Youth; 2013; Mendelson, J., Viera Jr., L.
- Multi-Mode Survey Administration: Does Offering Multiple Modes at Once Depress Response Rates?; 2013; Newsome, J., Levin, K., Langetieg, P., Vigil, M., Sebastiani, M.
- Responsive Design for Web Panel Data Collection; 2013; Bianchi, A., Biffignandi, S.
- Utilizing the Web in a Multi-Mode Survey; 2013; Venkataraman, L.
- Changing to a Mixed-Mode Design: The Role of Mode in Respondents' Decisions About Participation...; 2013; Collins, D., Mitchell, Ma., Toomes, M.
- Comparing the Effects of Mode Design on Response Rate, Representativeness, and Cost Per Complete in...; 2013; Tully, R.
- Internet Response for the Decennial Census – 2012 National Census Test; 2013; Reiser, C.
- The Effects of Pushing Web in a Mixed-Mode Establishment Data Collection; 2013; Ellis, C.
- The Effects of Errors in Paradata on Weighting Class Adjustments: A Simulation Study; 2013; West, B. T.
- Using Paradata to Study Response to Within-Survey Requests; 2013; Sakshaug, J. W.
- Paradata for Coverage Research; 2013; Eckman, S.
- The smart(phone) way to collect survey data; 2013; Stapleton, C.
- Online Fundraising Essentials, Second Edition; 2013; Stevenson, S. C.
- Tips for Evaluating Online Effectiveness; 2013; Stevenson, S. C.
- The Digital Divide: The internet and social inequality in international perspective; 2013; Ragnedda, M., Muschert, G.
- Survey quality prediction system 2.0; 2013
- Speed (necessarily) doesn't kill: A new way to detect survey satisficing; 2013; Garland, P., Chen, K., Epstein, J., Suh, A.
- Practical tools for designing and weighting survey samples; 2013; Valliant, R. L., Dever, J. A., Kreuter, F.
- Paradata in web surveys; 2013; Callegaro, M.
- Incentive effects; 2013; Goeritz, A.
- A nationwide web-based freight data collection; 2013; Samimi, A., Mohammadian, A., Kawamura, K.
- The E-Interview in Qualitative Research; 2013; Bampton, R., Cowton, C., Downs, Y.
- Methodological Considerations of Qualitative Email Interviews; 2013; Nehls, K.
- Best Practice in Online Survey Research with Sensitive Topics; 2013; Kays, K., Keith, T. L., Broughal, M. T.
- Reducing Response Burden for Enterprises Combining Methods for Data Collection on the Internet; 2013; Vik, T.
- Advancing Research Methods with New Technologies; 2013; Sappleton, N.
- Data Quality in PC and Mobile Web Surveys; 2013; Mavletova, A. M.
- PDAs in socio-economic surveys: instrument bias, surveyor bias or both?; 2013; Escobal, J., Benites, S.
- Virtual research assistants: Replacing human interviewers by automated avatars in virtual worlds; 2013; Hasler, B. S., Tuchman, P., Friedman, D.
- Compared to a small, supervised lab experiment, a large, unsupervised web-based experiment on a previously...; 2013; Ryan, R. S., Wilde, M., Crist, S.
- From mixed-mode to multiple devices. Web surveys, smartphone surveys and apps: has the respondent gone...; 2013; Callegaro, M.
- Moving an established survey online – or not?; 2013; Barber, T., Chilvers, D., Kaul, S.
- An approach to selecting online respondents; 2013; Terhanian, G.
- By the Numbers: Theory of adaptation or survival of the fittest?; 2013; Cavallaro, K.
- Cyborgs vs. Monsters: Assembling Modular Surveys to Create Complete Datasets; 2013; Johnson, E. P., Siluk, L., Tarraf, S.
- Shorter Isn't Always Better; 2013; Burdein, I.
- Internet-Based Recruitment to a Depression Prevention Intervention: Lessons From the Mood Memos Study...; 2013; Morgan, A. J., Jorm, A. F., Mackinnon, A. J.
- A standard for test reliability in group research; 2013; Ellis, J. L.
- Addressing Survey Nonresponse Issues: Implications for ATE Principal Investigators, Evaluators, and...; 2013; Welch, W. W., Barlau, A. N.
- Pros and cons of virtual interviewers – vote in the discussion about surveytainment; 2013; Półtorak, M., Kowalski, J.
- An Assessment of Incentive Versus Survey Length Trade-offs in a Web Survey of Radiologists; 2013; Ziegenfuss, J. Y., Niederhauser, B. D., Kallmes, D., Beebe, T. J.
- Clarifying Categorical Concepts in a Web Survey; 2013; Redline, C. D.
- Using Online and Paper Surveys - The Effectiveness of Mixed-Mode Methodology for Populations Over 50; 2013; De Bernardo, D. H., Curtis, A.